Definition of Markov chain

  • 1. (noun) A Markov process in which the parameter takes discrete time values (see the sketch below)

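To make the definition concrete, here is a minimal Python sketch of a discrete-time Markov chain: the time parameter advances in whole steps, and each next state depends only on the current state. The state names and transition probabilities are invented purely for illustration and are not part of the dictionary entry.

import random

# Hypothetical two-state chain; states and probabilities are example values.
states = ["sunny", "rainy"]

# transition[i][j] = probability of moving from states[i] to states[j];
# each row sums to 1.
transition = [
    [0.8, 0.2],  # from "sunny"
    [0.4, 0.6],  # from "rainy"
]

def simulate(start, steps, rng=random):
    """Walk the chain for `steps` discrete time steps and return the visited states."""
    current = states.index(start)
    path = [states[current]]
    for _ in range(steps):
        # Markov property: the next state depends only on the current state.
        current = rng.choices(range(len(states)), weights=transition[current])[0]
        path.append(states[current])
    return path

if __name__ == "__main__":
    print(simulate("sunny", 10))
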
Synonyms for the word "Markov chain"

Words semantically linked with "Markov chain"

Hyponyms for the word "Markov chain"